Brooklyn Dodger, another blogger on occupational and public health topics, alerts us to a NIOSH survey of the causes of occupational deaths. According to this study, which used 1997 US mortality data, the estimated annual burden of occupational disease mortality resulting from selected respiratory diseases, cancers, cardiovascular disease, chronic renal failure, and hepatitis is 49,000 deaths. At the same time, the Bureau of Labor Statistics estimates there are about 6,200 work-related injury deaths annually. If you looked only at the distribution of topics addressed by OSHA standards, you would get the impression that the greater emphasis on occupational injuries means they are the more severe problem. The data indicate that doesn’t seem to be the case, at least if you are looking at death on the job.
Note that this doesn’t factor in occupational illnesses and injuries that don’t kill workers. Examination of those statistics would probably still find that skin diseases and musculoskeletal injuries are the predominant workplace hazards.
These figures are pertinent when you consider that 12 years ago, OSHA’s rule on airborne contaminants, which would have reduced Permissible Exposure Limits (PELs) for hundreds of substances, was set aside by a federal Court of Appeals. OSHA had tried to “speed along” the process of revising PELs through "generic" rulemaking, grouping substances into 18 categories by their primary health response, such as neurotoxicity, sensory irritation, and cancer (the backup for OSHA’s revised PELs proposed in their 1989 airborne contaminants rule can be found here). OSHA’s reasoning for this approach was that developing standards on a chemical-by-chemical basis would require several years. The court rejected that reasoning, stating that each PEL had to stand on its own merits.
OSHA had been sued by the AFL-CIO, meaning the labor organization was trying to push the agency toward more scientifically based, and presumably lower, PELs. However, the outcome has been virtually no motion by OSHA to update its PELs, many of which are still based on occupational exposure guidelines developed prior to 1968. It is unlikely that OSHA will move forward on this except in response to litigation or prompting by Congress (I heard OSHA representatives say as much at an EPA conference a few years ago). The current OSHA rulemaking for a revised PEL for hexavalent chromium is in response to a lawsuit by the Public Citizen Health Research Group.
In today’s environment, where ample information exists regarding workplace hazards from chemical exposure, the absence of scientifically up-to-date standards should not stop employers from preventing harmful exposures or controlling them to the lowest level feasible. Is it really excusable for employers not to keep employee exposures as low as feasible, just because occupational exposure standards haven’t kept pace with current scientific knowledge about health hazards?
Also, when serious hazards are obvious, controls should be implemented even if there is no quantitative evaluation of exposure or harm. For example, it should be obvious these days not to do something like this with any solvent, particularly trichloroethylene:
Until fall 2003, workers in a Wilkes-Barre special-education school district gave little thought to the chemical they knew only as deglazing solvent.
Used to clean ink from two printing presses in the district's main administration building, the solvent routinely spilled onto the carpet. Its stench drifted through the air ducts. Still, it seemed nothing more than an annoyance -- until Antoinette Dominick was diagnosed with cancer.
Dominick is among roughly two dozen employees who have been diagnosed with non-Hodgkin's lymphoma, lupus, or other diseases that may be associated with exposure to TCE, the solvent used for cleaning, which exposed approximately 200 people who worked in the building over upwards of three decades. Though there is still some quibbling about the nature and strength of the evidence, there is enough suspicion, disseminated publicly, that trichloroethylene is carcinogenic to warn even the most poorly informed employer not to do something like that. You can argue all you’d like that we don’t “know” that TCE is a human carcinogen, that it takes a long time to develop scientifically based occupational exposure standards, and that if the government doesn’t pay attention to this issue, employers shouldn’t have to either. It’s still inexcusable that these things are happening in the 21st century.
The other day, I took MSNBC to task, and by inference, other mainstream media outlets, for stating that the “National Academy of Sciences raises by 20 times the amount of rocket fuel pollution in drinking water considered ‘safe'. . .”. At the time, I was just being snippy over journalistic cluelessness in general about safety and health risk concepts with regard to chemical exposure.
Unfortunately, that was written right after I had committed the same error (taking the NAS Reference Dose and calculating a drinking water action level using the default 2 L/day drinking water rate and 70 kg body weight for adults, without correcting for intake from other sources such as diet, without addressing potentially sensitive populations such as women of childbearing age and newborns, and so forth). As the Environmental Working Group rightly points out, the NAS said no such thing, and the process of going from a Reference Dose to a drinking water standard is a little more complex than nearly everyone, myself included, has depicted.
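For the curious, here is roughly what that naive calculation looks like, and how a relative source contribution (RSC) factor, one of the corrections I skipped, changes the answer. This is just an illustrative Python sketch; the 0.0007 mg/kg-day value is the NAS committee's recommended RfD discussed elsewhere in these posts, and the 20% RSC is a placeholder assumption of mine, not a number from the NAS report or EPA.

```python
# Illustrative sketch only: the naive RfD-to-drinking-water conversion described
# above, plus one example of how a relative source contribution (RSC) factor
# changes the answer. The RfD is the NAS perchlorate recommendation; the 20% RSC
# is a placeholder assumption, not a value from the NAS report or EPA.

RFD = 0.0007          # reference dose, mg/kg-day (NAS perchlorate recommendation)
BODY_WEIGHT = 70.0    # kg, default adult body weight
WATER_INTAKE = 2.0    # L/day, default drinking water ingestion rate

# Naive conversion: allot the entire RfD to drinking water
naive_level = RFD * BODY_WEIGHT / WATER_INTAKE                 # mg/L
print(f"Naive action level: {naive_level * 1000:.1f} ppb")     # ~24.5 ppb

# With an RSC, only part of the RfD is "budgeted" to drinking water,
# leaving room for diet and other exposure pathways.
RSC = 0.2   # assumed 20% of total intake from drinking water (illustrative)
adjusted_level = naive_level * RSC
print(f"With a {RSC:.0%} RSC: {adjusted_level * 1000:.1f} ppb")  # ~4.9 ppb
```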
It serves me right for trying to get something out quickly in response to a “hot” news story.
Four employees of the Michigan-based healthcare firm Weyco have lost their jobs after refusing to be drug-tested – for nicotine. Weyco has introduced a non-smoking policy that extends even away from the workplace.
On Tuesday, an appeals court revived part of a class-action lawsuit blaming McDonald's for making people fat. The court reinstated the claims pertaining to deceptive advertising.
The judge who had dismissed the lawsuit in 2003 said it failed to link the children's alleged health problems directly to McDonald's products. The appeals judges said New York's general business law requires a plaintiff to show only that deceptive advertising was misleading and that the plaintiff was injured as a result. The panel upheld other parts of the dismissal.
In dismissing the case, the judge had ruled that consumers could not blame McDonald's if they chose to eat at its fast-food restaurants. "[I]f a person knows or should know that eating copious orders of supersized McDonald's products is unhealthy and may result in weight gain," Judge Sweet had written, "it is not the place of the law to protect them from their own excesses." True enough. But if you watch “Supersize Me”, you discover that McDonald's has done a poor job of displaying nutritional information for its products.
As this unfolds, I’m sure we’ll hear about frivolous lawsuits that threaten to destroy American business, along with renewed calls for tort reform. However, if you’re not going to have robust regulatory agencies, tort liability still needs to be around. It would be interesting to find out more about McDonald's advertising practices (such as whether or not they target kids, just like the tobacco manufacturers are alleged to do). As Steve Gilliard comments, “trials are a good way to find out things we don’t know. . .”.
Courtesy of Effect Measure, we get a story about how the compliant Bush Administration has permitted utility industry lobbyists to draft portions of proposed regulations for control of mercury emissions from coal-fired power plants. Senator Patrick Leahy, who is asking EPA for an explanation, has some of the documents showcasing industry’s involvement with the rulemaking (let’s hear it for eDocket!) on his web site.
Previously, EPA rulemaking would have required 1,100 coal- and oil-fired power plants to meet an emissions standard that sharply reduced mercury emissions, with a three-year compliance timeframe. Industry organizations strongly criticized the proposed rule, saying it would be excessively costly and impossible to meet with existing technology. The Bush Administration set those regulations aside and proposed instead a "cap and trade" program, similar to what has been used to control the power plant emissions that produce acid rain. That plan would reduce mercury emissions by nearly 70 percent, but with a 2018 compliance date. It would let utilities buy emission credits from cleaner-operating plants to meet an overall industry target without having to install controls on every power plant. The cap and trade program is part of the “Clear Skies” initiative, and a better name for an industry-friendly air pollution control program couldn’t have been found. However, we’re told, amazingly enough in the Washington Monthly, that Clear Skies isn’t half-bad.
It must be an interesting place, the parallel universe where all this stuff is coming from. It can’t be this Earth, where mercury concentrations have been increasing worldwide, particularly in the past few decades, including in Arctic ecosystems well away from any emissions sources; where 300,000 newborns per year in the U.S. may have been placed at an increased risk of adverse neurodevelopmental effects as a result of in utero methyl mercury exposure; and where women have to limit the amount of fish they eat to the equivalent of a paltry two cans of tuna a week so as not to put their fetuses at risk from mercury exposure. Even with significant, very rapid reductions in power plant emissions, we’re going to have mercury exposures with us for a very long time. It’s difficult to see how “Clear Skies” is going to have any effect at all on global mercury cycling or human exposure to mercury.
Washington Monthly is going to have to publish a lot of articles by Philip Longman to make up for this one.
Officials in the city of San Francisco are considering a proposal to tax plastic grocery bags; in fairness, paper bags would be taxed as well. According to a study by the city’s Environmental Department, plastic grocery sacks jam recycling machines and litter the city streets. The 50 million plastic and paper grocery sacks used each year in San Francisco cost the city $8.5 million in cleanup and other costs, or about 17 cents per bag. The purpose of the 17-cent-per-bag tax would be to encourage people to reuse bags or turn to reusable bags; it is anticipated that stores, which would be responsible for paying the tax, would pass the cost on to shoppers.
Not everyone is thrilled with the idea. The American Plastics Council argues that the energy savings would be trivial, compared with other forms of consumption such as driving motor vehicles. The California Grocery Association argues that the tax would be a burden on consumers. I can see their point – a tax promoting recycling and resource conservation is the enemy of economic prosperity. In another light, it would be wise to consider carefully what we wish for, because a movement to a truly sustainable economy will probably involve terrible short-term economic dislocations. But it is still better than the alternative. Besides, we have enough perverse subsidies as is.
This might be an interesting idea for promoting environmental or public health education. Game-playing may have advantages over traditional teaching tools. Games such as role-playing games hold the potential for learning by letting players take on new personas, explore alternatives and solve problems. Games also might hold the promise of helping to engage people better in a topic – the theory being that playing “Rome Total War” could encourage you to start reading about the Roman Empire.
The U.S. Army thinks there is promise in this approach, using freeware such as “America’s Army” and “Full Spectrum Warrior” (the latter also being a training tool) as recruiting aids.
“Games to Teach” is a product of the Massachusetts Institute of Technology (MIT) Comparative Media Studies department. Examples of games that it sponsors include “Biohazard”, in which an epidemiologist tracks down a disease outbreak, and “Environmental Detectives”, which lets players assume the roles of various stakeholders investigating and correcting a hazardous materials spill into a river. The U.S. Department of Energy uses SimSite, an interactive training tool for learning the Data Quality Objectives (DQO) process for hazardous waste site investigation. Imagine how much more interesting a tool such as SimSite could be if it looked like this.
Let’s hope this trend can attract some funding and take off in environmental education.
There is no doubt that deaths and property damage from fires have been reduced by the introduction of brominated fire retardants (BFRs). The tradeoff has been that residues of these substances have bioaccumulated through the food chain and are now found in the blood and breast milk of humans. There are indications that exposures have been increasing over time, and they are getting a lot of attention in the environmental health community.
There has been movement occurring (see here and here) to remove certain BFRs (principally polybrominated diphenyl ethers – PBDEs) from the product life cycle. However, millions of pounds of these substances already have been introduced into commerce, and the potential exposures to these substances are widespread (see here and here).
While the evidence for adverse effects associated with exposure to PBDEs comes principally from laboratory animals, it is worrisome enough to support arguments for limiting human exposure. Phasing out use of PBDEs is one step, but given the long environmental persistence of these substances, don’t expect that step to have any effect on human exposure for years or even decades. Individuals may have to take their own measures to reduce personal exposure. For my part, I feel there is some hope that managing your microenvironment could have an impact in reducing persistent organic pollutant (POP) exposure and body burden.
As yet, there isn’t a comprehensive program for helping people reduce their personal exposure to POPs, though it appears that the matter has been studied. Research also has been conducted on the use of indoor dust as a metric for exposure, so it is probably a small step from exposure assessment to exposure mitigation. Diet is also a primary pathway for POP exposure, particularly diets high in animal fats, so eating better might have an exposure reduction benefit too. Lastly, using our power as consumers to create disincentives for products with POPs in their life cycle, and incentives for “green” alternatives, is only a long-term solution, but in the end it could be the most effective one. For many, this sounds like a real pain in the ass, and it probably is. But it is also probably time to face these kinds of problems directly, if we wish to forestall the collapse.
Triggers for asthma attacks can include environmental stressors such as cigarette smoke, cockroaches, dust mites, mold, animals, pollen, cold air, exercise, stress, and respiratory infections.
Indoor cleaning to reduce levels of allergens is one of the interventions recommended to reduce asthma attacks. However, a paper published last year in Thorax reported that indoor volatile organic compound concentrations (benzene, ethylbenzene and toluene were mentioned) were risk factors for asthma attacks. The abstract doesn’t discuss the potential sources of the VOC emissions (the paper can be downloaded for a fee); however, news reports speculated that indoor VOC emission sources included domestic and cleaning products.
The Soap and Detergent Association (SDA) was in a bit of a lather over this issue. After investing in studies about the effectiveness of cleaning and cleaning products in reducing allergen loads indoors, and developing a resource guide, the last thing the industry wanted to hear was that cleaning products could cause asthma too. SDA asserted that such a study sends the wrong message about cleaning and asthma.
That response might have been a bit overwrought. A search in NLM’s Household Products Database indicates that, for the three VOCs mentioned, the preponderance of products are engine and parts cleaners, spray paints and pesticide formulations. They are also constituents of gasoline and cigarette smoke. All this shows is that there may be more complexities to managing the indoor environment with regard to asthma than just keeping down the allergens.
There seems to be a lot of attention to this issue of VOC exposures and asthma.
Environmental Health Tools – Canadian Environmental Modeling Center
I was looking around for resources to build a toxicokinetic model in Excel for ingestion exposures to persistent organic pollutants when I happened upon the Canadian Environmental Modeling Center (CEMC) at Trent University. CEMC is organized around Dr. Donald Mackay, who is best known for the development of fugacity-based multimedia fate and transport models. Fugacity models potentially provide a way to understand, in a graded fashion from simple to more complex phenomena, how chemicals behave and are transported through the environment.
For those interested in seeing more about multimedia modeling, here is a link to CEMC’s EQC or EQuilibrium Criterion model. EQC uses chemical-physical properties to quantify a chemical's behavior under specific environmental conditions. You need to supply your own physical-chemical properties, but most of those can be found here and here.
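To give a flavor of what a fugacity calculation involves, here is a stripped-down, Level I-style sketch in Python (equilibrium partitioning only, no reactions or advection). To be clear, this is not the EQC model itself, and all of the chemical properties and compartment volumes below are made-up, illustrative values.

```python
# A stripped-down, Level I-style fugacity sketch (closed system at equilibrium,
# no reactions or advection). This is NOT the CEMC EQC model -- just an
# illustration of the idea that fugacity capacities (Z values) and compartment
# volumes determine how a fixed amount of chemical partitions at equilibrium.
# All property and volume values are made-up, illustrative numbers.

R = 8.314          # gas constant, Pa*m3/(mol*K)
T = 298.0          # temperature, K

# Hypothetical chemical properties (illustrative only)
H = 50.0           # Henry's law constant, Pa*m3/mol
Koc = 1.0e4        # organic carbon partition coefficient, L/kg

# Hypothetical "unit world" compartments (illustrative only)
V_air, V_water, V_soil = 1.0e9, 1.0e6, 1.0e4   # m3
foc = 0.02                                      # soil organic carbon fraction
rho_soil = 2400.0                               # soil solids density, kg/m3

# Fugacity capacities, mol/(m3*Pa)
Z_air = 1.0 / (R * T)
Z_water = 1.0 / H
Z_soil = Z_water * Koc * foc * rho_soil / 1000.0   # Kd = Koc * foc, in L/kg

# Partition 100 mol of chemical among the compartments
M_total = 100.0   # mol
VZ = {"air": V_air * Z_air, "water": V_water * Z_water, "soil": V_soil * Z_soil}
f = M_total / sum(VZ.values())   # common fugacity shared by all compartments, Pa

for name, vz in VZ.items():
    amount = f * vz
    print(f"{name:5s}: {amount:6.1f} mol ({100 * amount / M_total:.1f}%)")
```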
Posting has been light because of many, many tasks at work this week. Next week should be better.
All of the usual suspects must be crowing now that the CDC is going to formally revise downward its estimates of obesity deaths, which were miscalculated due to an arithmetic error. Time to break out the deep-fried Snickers bars, I suppose.
I’ll bet that even Steven Milloy owns a treadmill, though.
Washington D.C. is not a healthy city. According to an article recently published in the Washington Post, it suffers from the highest cancer death rate for men and women, one of the highest rates of AIDS, and syphilis and gonorrhea infection rates that are more than three times the U.S. average. Its infant mortality rate is among the worst in the country, and life expectancy for black males is less than 60 years.
The “good news” for Washington D.C. is that it fares better when benchmarked against other big cities rather than against the 50 states. However, this points to the worrisome fact that health in our big cities is not uniformly good. This is analyzed in the “Big Cities Health Inventory, 2003: The Health of Urban USA”, published by the Chicago Department of Public Health (CDPH). The report presents comparisons of leading measures of health for the 47 largest cities in the U.S.
According to the Big Cities Health Inventory, in 2000 almost a third of the United States population lived in metropolitan areas with at least 5,000,000 residents. These areas were among the fastest growing, with an 11% increase from 1990.
At the same time, residents of large cities are at greater risk of morbidity and mortality than residents in suburban and rural areas, particularly among minority populations. The increased health risks are associated with indicators such as access to quality medical care, socioeconomic status and discrimination. Other factors such as income disparity and uneven distribution of social and economic resources are also significantly related to poor health outcomes. Poverty poses a risk to human health.
Public health status could be an indicator of more worrisome trends in big cities. Along the same lines as a “failed state”, I’ve recently been introduced to the concept of a “feral city”. A “feral city” would be a metropolis with a population of more than a million people where the regional, state or national government has lost the ability to maintain the rule of law within the city’s boundaries, yet the city incongruously remains a functioning actor in the greater international system. In a feral city, social and health services are all but nonexistent, yet the city does not descend into total chaos. Control over various portions would be exerted by criminal gangs, armed resistance groups, clans, tribes, or neighborhood associations.
As befits an essay published by the Naval War College, this topic has been explored to understand the challenges to the U.S. military (of course, we’re watching those challenges being played out today in Iraq). Beyond being a magnet for terrorist groups and criminal gangs, feral cities would be potential sites for pandemics and massive environmental degradation.
This might be a speculative tale, but perhaps it is a warning that failing to improve public health indicators in our large cities could be a first step toward going feral.
David Pollard’s commentary on Jared Diamond’s new book, “Collapse” is well worth reading. He summarizes several other reviews, including one by Malcolm Gladwell in the New Yorker. Gladwell’s review is the gem, especially when he writes:
“The lesson of “Collapse” is that societies, as often as not, aren’t murdered. They commit suicide: they slit their wrists and then, in the course of many decades, stand by passively and watch themselves bleed to death.”
According to Gladwell, Diamond uncovers an underlying assumption that survival as a culture and a society corresponds to biological survival:
“The fact is, though, that we can be law-abiding and peace-loving and tolerant and inventive and committed to freedom and true to our own values and still behave in ways that are biologically suicidal. The two kinds of survival are separate.”
Supposedly, Diamond remains optimistic that we’ll navigate the crisis, though Pollard notes that the book’s message belies that optimism.
“. . .we are doomed to stay loyal to our culture to the bitter end, against all reason, and contrary to our instincts. Cultures just change too slowly, and our current one has 30,000 years of baggage attached to it, way too much acquisition-and-population momentum and I-can't-hear-you-la-la-la inertia to respond to Diamond's urgings for quick, citizen-driven action, even if that action is, some day, forthcoming.”
Diamond insists that the solutions to this crisis involve the cooperation of businesses, because they, along with governments, are the most potent forces in the world today. His view is that the public has the responsibility to pass the laws, and to make the purchasing decisions, that will encourage businesses to behave better. Diamond is probably correct, though Pollard remains pessimistic that this could ever come about. Pollard might have a point there.
Head over to Pollard’s blog and have a read. It’s very thought provoking stuff.
Roger Pielke (who writes at Prometheus) cites a National Research Council (NRC) report recommending that presidential appointees to science and technology committees not be asked about their political or policy viewpoints, because these do not predict an expert’s perspective on a particular policy question.
He seems to be arguing that politics can’t be segregated from the empaneling and working of expert committees. His proposed solution is “to focus our attention on developing transparent, accountable and effective processes to manage politics in science -- not to pretend that it doesn't exist”.
Chris Mooney questions how this would prevent politicians from stacking panels during the empaneling process.
After some thought, I come down on Chris’s side on this. I’m a fan of transparency, but how much do people really pay attention to who these experts associate with, who they are getting funding from, what positions they have on different issues in their disciplines? If people in a transparent society start paying close attention to these issues, and see that there is no balance in expert committees, they could eventually become cynical about expertise altogether.
Expert panels depend on their credibility, and politicians would do well to think twice about the implications of stacking expert committees just to get outcomes that benefit their sponsors, unless they’re interested in wrecking science altogether.
MSNBC’s story yesterday on the NAS perchlorate report said, “[a] new report from the National Academy of Sciences raises by 20 times the amount of rocket fuel pollution in drinking water considered ‘safe'. . .”, a statement that glosses over a myriad of important distinctions in its imprecision.
But it evokes such drama! It has conflict. . . and protagonists. . . and narrative. . .
The National Academy of Sciences Perchlorate Report
The National Academy of Sciences (NAS) came out yesterday with its report on the health risks from perchlorate in drinking water. Perchlorate is an oxidizer used in solid rocket propellant. Perchlorate leaches readily through soil, and is a groundwater contaminant in many locations in the U.S. Millions of people are potentially exposed to low levels of perchlorate in their drinking water. There is the concern that low-level perchlorate exposure may result in neurodevelopmental effects in young children from disruption of thyroid function. Regulatory agencies have labored for years to estimate perchlorate levels in water that would not be associated with adverse health effects in humans. In 2002, the Environmental Protection Agency (EPA) prepared a health risk assessment, which concluded that a level of one part-per-billion (ppb) in drinking water would not be associated with adverse health effects in humans. That assessment was subsequently challenged by the Department of Defense (DOD) and members of the defense industry, and eventually ended up with the NAS. The NAS’s report was just published (a prepublication version can be downloaded here).
The NAS committee was charged with assessing the state of the science with regard to the thyroid toxicity of perchlorate, evaluating the animal toxicity studies used to assess human health risks, and, based on this review, determining whether the findings in EPA’s risk assessment for perchlorate were consistent with the current scientific evidence.
As yet, I’ve only skimmed through the report (it’s over 200 pages) in order to get this posted in a timely manner. I skipped to the end to see how EPA’s risk assessment fared. The NAS committee disagreed with EPA’s use of animal toxicology data showing changes in the thyroid gland as the no observed adverse effect level (NOAEL) for developing the Reference Dose (a short definition of the RfD is here). Instead, it recommended that inhibition of iodide uptake by the thyroid in humans, a key biochemical event and a precursor to adverse effects, serve as the basis for the RfD. It recommended using, in combination with other evidence, a short-term ingestion exposure study involving healthy adult men and women, which identified a no-observed-effect level (NOEL) for inhibition of iodide uptake by the thyroid. Based on this study, the NAS committee recommended an RfD of 0.0007 mg/kg-day, which, if you do the arithmetic, comes out to something a little more than 20 ppb in drinking water. The EPA’s 2002 risk assessment estimated an RfD of 0.00003 mg/kg-day, corresponding to approximately 1 ppb in drinking water. I'll need to post more later about the reasoning underlying the committee's recommendation.
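To make the “do the arithmetic” step explicit, here is the back-of-the-envelope conversion, using the default adult assumptions of 70 kg body weight and 2 L/day of drinking water, and assuming all of the allowable dose comes from water. A real drinking water standard would also account for other exposure sources and sensitive populations, so treat this as a rough comparison only.

```python
# Back-of-the-envelope conversion from a reference dose (RfD) to a drinking
# water concentration, using default adult assumptions (70 kg body weight,
# 2 L/day water intake) and assuming all of the allowable dose comes from
# drinking water. No relative source contribution or sensitive-population
# adjustments, so this is a rough comparison, not a regulatory standard.

def rfd_to_ppb(rfd_mg_per_kg_day, body_weight_kg=70.0, intake_L_per_day=2.0):
    """Convert an RfD (mg/kg-day) to a water concentration in ppb (ug/L)."""
    mg_per_L = rfd_mg_per_kg_day * body_weight_kg / intake_L_per_day
    return mg_per_L * 1000.0   # mg/L -> ug/L (ppb)

print(f"NAS RfD 0.0007 mg/kg-day  -> {rfd_to_ppb(0.0007):.1f} ppb")   # ~24.5 ppb
print(f"EPA RfD 0.00003 mg/kg-day -> {rfd_to_ppb(0.00003):.2f} ppb")  # ~1 ppb
```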
This should be interesting. Last month, in a story published in the Riverside (Inland Southern California) Press-Enterprise, it was reported that one of the authors of the study on which the NAS committee based its recommended RfD had rewritten a news article on perchlorate, which had been submitted by a freelance writer and published in the same issue of Environmental Health Perspectives as the RfD study. The article had been rewritten to portray perchlorate risks as a less significant public health concern. It’s an interesting account, and the P-E kindly provides “before” and “after” versions of the freelancer’s article (free registration required). While it's possible that this was standard practice, it’s the sort of thing that lends credence to what some people say about risk assessment.
Even before the report had been issued, the NRDC charged that the NAS panel had been influenced by the White House, the Defense Department and defense contractors. The alleged influence included lobbying of the White House by defense contractors, lobbying of the NAS by DOD and defense contractors, DOD and White House involvement in drafting the charge to the NAS panel, and the seating of panel members with views sympathetic to the DOD and the defense industry.
Sorting all of this out is going to take some time. FOIA requests to the White House and DOD by NRDC resulted in a harvest of documents. For example, in discussing the lobbying of the NAS committee, NRDC’s expectation appears to have been that the committee should have been sequestered like a jury. At this point, it’s hard to know if the meetings and submittal of documents to the committee by DOD and industry members represent the normal interplay during an expert review, or lobbying. I’m no real judge in these matters, and I welcome the views from anyone who has served on expert panels at this level. However, the redactions and refusal to produce documents by the DOD and White House in response to the FOIA request, as reported by NRDC, are troubling at the very least.
Expert scientific panels are not really practicing science, and do not make regulatory policy. However, because they operate at that interface between science, policy and regulation, the stakeholder interaction (industry lobbying, NGO activism) will inevitably be part of the process, and not on a level playing field either. The one thing underpinning the assessment of environmental health risks is the credibility of that process. Individual citizens can’t judge whether or not the rat bioassays or epidemiological studies were conducted properly, or were characterized appropriately in risk assessments. Even when the information is presented in detail, such as in the NAS report, you’re pretty much taking it at face value. The outcome won’t be viewed as trustworthy if the process isn’t perceived as transparent and credible. Maybe David Brin has a point with his views about the transparent society. Stay tuned as this story evolves.
Postscript: the Environmental Working Group (EWG) views the NAS report as an endorsement of a low drinking water action level for perchlorate. That’s a point that bears further examination. Depending on the distribution of drinking water exposures, there may not be a lot of difference between 1 and 20 ppb, or it could represent a step-function in potential health risks.
In “Supersize Me”, filmmaker Morgan Spurlock subjects himself to a diet of nothing but food from McDonald's for 30 days, with dramatic results. He gains a lot of weight, his cholesterol level skyrockets, his liver begins to show signs of injury, and he experiences a range of symptoms, including headaches, mood swings, symptoms of addiction, and depressed libido. It was a bit of a gimmick, even if he was under medical supervision.
However, a study recently published in the Lancet (free registration required to view the abstract) has systematically evaluated the “Supersize Me” phenomenon:
Mark A Pereira, Alex I Kartashov, Cara B Ebbeling, Linda Van Horn, Martha L Slattery, David R Jacobs Jr, David S Ludwig. 2005. Fast-food habits, weight gain, and insulin resistance (the CARDIA study): 15-year prospective analysis. Lancet. 365(9453): 36-42.
The key findings from the abstract are that fast food could increase risk for obesity and diabetes through 1) excessive portion size with single large meals often approaching or exceeding individual daily energy requirement; 2) palatability that emphasizes primordial taste preferences for sugar, salt and fat; and 3) high glycemic load. The high glycemic load and trans-fatty acid content also might enhance the risk of diabetes through mechanisms unrelated to energy balance. I’m looking forward to reading the paper.
In a follow-up to the stories about the Graniteville, South Carolina hazardous materials incident that left 8 (now 9) dead and scores injured, the New York Times reports that ten months ago government safety officials reported that more than half of the nation's 60,000 pressurized rail tank cars did not meet industry standards, and that there were concerns about the safety of the remainder of the tank car fleet as well.
Following a bit of detective work, I traced the quote to this source (would it have killed the Times to provide a link in their article?). It is an accident report prepared by the National Transportation Safety Board (NTSB) on a rail tank car derailment in Minot, North Dakota in January 2002. Five tank cars carrying anhydrous ammonia, a liquefied compressed gas, catastrophically ruptured, releasing a vapor plume. About 11,600 people occupied the area affected by the plume. The incident resulted in one fatality, 11 serious injuries, and over 300 minor injuries.
The NTSB reported that the shells of the five tank cars that catastrophically failed were built before 1989 and were fabricated from “non-normalized” steel, which was more brittle and susceptible to cracking. This was thought to result in an instantaneous release of the 146,700 gallons of anhydrous ammonia within moments of the derailment in Minot, which produced a much larger and more concentrated plume of ammonia gas than would have occurred if the same quantity had been released more slowly, allowing the ammonia to dissipate gradually in the atmosphere.
To address the problem of brittle, more easily fractured steel, the shells of pressure tank cars built since 1989 have been required to be fabricated from “normalized” steel. Without going into the metallurgy, normalizing increases the ductility and fracture toughness of the steel plate. According to the NTSB, the American Association of Railroads (AAR) standard for tank car shells called for the use of normalized steel beginning in 1988.
During the public hearing on the accident, industry representatives and railroad regulators (including the U.S. Federal Railroad Administration, or FRA) all stated that pressure tank cars constructed before 1989 were safe and had a good safety record, that the catastrophic brittle failures seen in the Minot derailment were rare, and that the tank cars would have failed and released their cargoes of anhydrous ammonia anyway, even if they had been made with the more crashworthy steel.
I’m quoting the following from the report (with emphasis added), because it needs to be seen to be believed:
During its public hearing on the Minot accident, the Safety Board explored possible options to reduce the risks posed by pre-1989 pressure tank cars. However, representatives from the FRA and tank car manufacturers raised various objections to each of these options based on concerns about the expense, questionable safety benefits, and new risks that might develop if existing operating procedures were changed for the railroads and shippers.
Neither the FRA nor industry representatives have offered a resolution to the issue of pre-1989 cars other than acknowledging the need to better understand the forces acting on tank cars during derailments and ranking the existing pre-1989 tank car fleet to identify the tank cars with the highest risk. Regarding the ranking of the pre-1989 pressure cars, no specific ideas were offered on how to accomplish such a ranking.
Approximately 60 percent of pressure tank cars currently in service were built before 1989 and very likely were constructed from non-normalized steel. Additionally, tank cars may remain in service for up to 50 years, which means that the last pressure tank cars constructed of non-normalized steel could remain in service until 2039. Further, according to AAR statistics, there were more than 1.23 million tank car shipments of hazardous materials in 2000 (the last year for which data are available) in the United States and Canada. Of the top ten hazardous materials transported by tank car, five were class 2 liquefied compressed gases (LPG, anhydrous ammonia, chlorine, propane, and vinyl chloride) that together accounted for more than 246,600 tank car shipments, or about 20 percent of all hazardous materials shipments by tank car.
Consequently, the Safety Board is concerned about the continued transportation of class 2 hazardous materials in pre-1989 tank cars. Because of the high volume of liquefied gases transported in these tank cars and the cars’ lengthy service lives, the Safety Board concludes that using these cars to transport DOT class 2 hazardous materials under current operating practices poses an unquantified but real risk to the public.
In a tank car crash in Texas last summer, the tank car that ruptured and released chlorine gas was made before 1989, though federal investigators have not yet concluded whether brittle steel played a role in that accident. The South Carolina crash involved the rupture of a newer tank car manufactured in 1993, according to its owner, the Olin Corporation.
The federal government’s glacial response to this problem isn’t terribly surprising (the FRA is sponsoring studies that should be completed in the next few years). What does surprise me is that emergency managers and first responders aren’t yelling long and loud about this. Also, wouldn’t better tank car survivability provide a homeland security benefit?
From the BBC news via the Left Coaster, we learn that the giant Chinese energy company CNOOC plans to offer $13 billion for the purchase of Unocal. I think the resource wars are starting to heat up a bit more.
By now, you’ve probably heard about Thursday’s railroad accident and chlorine spill in Graniteville, South Carolina, which left 8 dead, hundreds injured, and a town of 5,000 evacuated (I found it over at Confined Space).
Jordan at Confined Space also draws our attention to a story back in October 2004 stating that, despite the heightened post 9-11 awareness of hazardous materials transport, the railroad industry has resisted efforts by the government to re-route hazardous materials rail transport around densely populated areas. In particular, the Bush administration has rebuffed efforts by the District of Columbia city council to implement measures that would require railroads to re-route hazardous materials shipments around populated areas, or at least provide vulnerability assessments and notifications for shipments. Instead, the federal government favors voluntary measures worked out in cooperation with the industry.
As reported in Govexec.com, the latest concrete sign of progress in the effort to secure hazardous materials shipments was a notice that appeared in the Federal Register in August 2004. The Departments of Transportation (DOT) and Homeland Security (DHS) published the notice to solicit comments on the feasibility of steps to reduce the risk posed by shipments of substances that are toxic by inhalation. The notice is 8 pages long, densely worded and a hard read. An interesting way to cut to the chase, though, is to go to the e-docket and look at the comments (look for Docket No. 18730 by the RSPA, the Research and Special Programs Administration of the DOT; use the “advanced search” feature).
Amidst all of the pleas from industries not to place any further requirements on them, you find some gems, including comments from the states of West Virginia and California, which note that the DOT/DHS measures include a proposal to remove hazardous material placards from commercial transportation carriers, because the placards could be used by terrorists to identify hazardous materials as potential weapons. The states rightly point out the difficulties this would create for first responders in managing hazardous materials incidents, as well as the health and safety risks it would create for those same responders. Comments by the International Association of Emergency Managers (IAEM) emphasize the need to be able to see hazmat placards from a safe distance and initiate appropriate response actions when necessary: initiating emergency shelter-in-place or evacuation procedures for the public and avoiding unprotected responder exposure has saved lives and avoided injuries, while failing to initiate these actions appropriately has caused deaths and injuries. The IAEM comments acknowledged that protecting the public from the effects of terrorist actions is a real need, but stressed that changes to procedures related to hazmat placards must consider the potential for a greater negative impact on the public from the more likely event of a hazardous materials incident. Am I reading this right? Our federal government is proposing steps that would actually increase risks to the public from hazardous materials incidents, in an effort to reduce the much rarer risk of a terrorist incident?
Revere has started up a weekly feature profiling one of the blogs on Effect Measure’s blogroll, and his first candidate is Impact Analysis! As a token of my appreciation, I’ll get on the bandwagon and introduce you to How to Save the World, published by David Pollard. He’s a Canadian management consultant who publishes a blog with a distinctly environmental flavor and a strong emphasis on creation and propagation of knowledge in business enterprises. The quantity and quality of content on this blog is awe-inspiring and I wonder where he finds the time to create it. My favorite bits are the flow charts (check it out and you’ll see what I mean).
Also, be sure to regularly click on two of my favorites, Effect Measure for public health commentary and Confined Space for workplace health and safety issues.
Environmental Health Tools – Education Initiative from EHP
The journal Environmental Health Perspectives (EHP) is conducting a one-year pilot project of creating and distributing environmental health science lessons based on news articles published in EHP. The lesson plans, which are at a high-school level, are aligned with National Science Education Standards for various disciplines including biology, chemistry, environmental science, geology, and physical science.
While doing some research for an earlier benzene post, I rediscovered a good resource, Late Lessons from Early Warnings: the Precautionary Principle 1896–2000, published by the European Environment Agency (EEA) and available free of charge. It presents accounts of cases where early warnings of impending harm to health or the environment were ignored, and serves as an advertisement for use of the Precautionary Principle in environmental regulation. A range of cases is considered, including the destruction of the Californian sardine industry through overfishing, the mesothelioma epidemic resulting from asbestos exposure, groundwater contamination with the gasoline additive MTBE, "mad cow disease", and carcinogenic risks from benzene exposure.
A review in the online edition of Occupational and Environmental Medicine published in 2002, provides a summary of the report, along with a concise discussion of some of the problems associated with implementing the precautionary principle in environmental regulations.
The Center for Health, Environment and Justice has come out with a report that points to potential health and environmental threats from polyvinyl chloride (PVC). The report details the environmental burden associated with PVC, including the usual suspects: dioxin emissions during open burning or incineration, groundwater contamination, and the filling of municipal solid waste landfills. I’ll quibble about the hazard evaluation some other time, but of greater value were the discussions of alternatives to PVC, and tips for organizing and activism.
Jared Diamond wrote about how European societies used firearms, the infectious diseases fostered by their dense populations, and technological innovation to conquer tribal peoples throughout much of the world ("Guns, Germs and Steel: The Fates of Human Societies"). He recently published another book, "Collapse: How Societies Choose to Fail or Succeed", that examines ancient societies that perished and societies that have persisted into modern times. Since I’ve just heard about it, I haven’t read it yet. It’s on my reading list, though.
Using a multidisciplinary approach, he examines the extinction of four ancient societies: Easter Island in Polynesia, the Anasazi of the southwestern United States, the Mayan civilization, and the Viking settlements on the coast of Greenland. In each of the cases analyzed, a principal cause of extinction was ecocide, or unintentional ecological suicide. However, environmental degradation by itself did not result in extinction. Diamond concludes that a society’s fate in response to these crises is determined by how well its leaders and citizens anticipate problems before they become crises, and how decisively the society responds. However, as the Christian Science Monitor’s review says, “[s]uch factors may seem obvious, yet Diamond marshals overwhelming evidence of the short-sightedness, selfishness, and fractiousness of many otherwise robust cultures. He reveals that many leaders were (and are) so absorbed with their own pursuit of power that they lost sight of festering systemic problems.”
Diamond links the analyses of ancient societies with modern examples, including Somalia, Rwanda, Haiti, China, and Australia, as well as Montana, a state that once was among the wealthiest in the nation but now struggles with poverty, environmental problems and population loss.
This isn’t the first time I’ve heard the message of trying to anticipate and respond to wide-ranging environmental problems. It will be interesting to see the linkages between the lessons in “Collapse” and the lessons from the precautionary principle. “Collapse” provides a much needed message, in these times when our currently elected leaders do not appear to be very interested in the lessons of history.
I found on Riskworld a news item about an analysis of 13 case-control studies of indoor radon exposure and lung cancer incidence from nine European countries, to be published in the British Medical Journal. Recall that a case-control study is a retrospective comparison of the exposures of persons with a disease (cases) versus persons without the disease (controls), and is one of the more robust study methods in epidemiology (some resources on epidemiology are here and here). The authors concluded that collectively, though not separately, these 13 studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers.
The authors stratified smokers into a separate group (i.e. examined cancer incidence and radon exposure separately between smokers and non-smokers). After stratification of smokers, there was strong evidence of an association between indoor radon concentrations and lung cancer. The dose-response relation seemed to be linear, with no evidence of a threshold dose; evidence of a dose-response relationship is an important factor in toxicology and epidemiology in judging that an environmental hazard can produce adverse health effects. A linear dose-response relationship is considered to be a characteristic feature of radiation carcinogenesis (we’ll talk about hormesis some other time).
The absolute risk to smokers and recent ex-smokers was much greater than to lifelong non-smokers. In the absence of other causes of death, the absolute risks of lung cancer by age 75 years at usual radon concentrations of 0, 100, and 400 Bq/m3 (0, 2.7 and 10.8 pCi/L for us Americans) would be about 0.4%, 0.5%, and 0.7%, respectively, for lifelong non-smokers, and about 25 times greater (10%, 12%, and 16%) for cigarette smokers. Radon in the home accounts for about 9% of deaths from lung cancer and about 2% of all deaths from cancer in Europe. The authors note that the findings of this analysis were consistent with comparable studies in North America and China.
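Here is a small sketch of what that linear, no-threshold relationship implies, using the non-smoker risks quoted above and the Bq/m3-to-pCi/L conversion. The straight-line fit is my own illustration, anchored to the quoted endpoints, not the authors' actual fitted model.

```python
# Illustration of the linear, no-threshold dose-response described above, using
# the absolute lifetime lung cancer risks quoted from the pooled analysis for
# lifelong non-smokers (0.4%, 0.5%, and 0.7% at 0, 100, and 400 Bq/m3). The
# straight-line fit is my own illustration anchored to those endpoints, not the
# authors' fitted model; smokers' risks were quoted as roughly 25 times higher.

BQ_PER_PCI_L = 37.0   # 1 pCi/L = 37 Bq/m3

def pci_per_L(bq_per_m3):
    """Convert a radon concentration from Bq/m3 to pCi/L."""
    return bq_per_m3 / BQ_PER_PCI_L

def nonsmoker_risk(bq_per_m3, baseline=0.004, slope_per_bq=7.5e-6):
    """Lifetime lung cancer risk for a lifelong non-smoker under a simple
    linear, no-threshold assumption (slope chosen so the line passes through
    the quoted 0.4% at 0 Bq/m3 and 0.7% at 400 Bq/m3)."""
    return baseline + slope_per_bq * bq_per_m3

for conc in (0, 100, 400):
    print(f"{conc:3d} Bq/m3 ({pci_per_L(conc):4.1f} pCi/L): "
          f"lifetime risk ~{nonsmoker_risk(conc):.1%}")
```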
More information on radon can be found here, including U.S. Environmental Protection Agency’s (EPA) risk assessment on radon, and a neat radon hazard map of the U.S.
There are some interesting asides to this issue. First, when figuring out priorities for risk reduction, radon and other indoor air contaminants such as PCE seem to be less “popular” than cleaning up volatile organic compounds (VOCs) as groundwater contaminants or outdoor toxic air pollutants. Indeed, one of the hottest issues in managing hazardous waste sites is indoor vapor intrusion of VOCs from soil or groundwater contamination. Vapor intrusion gets a lot of press because it has protagonists, controversy and a narrative. It’s hard to find someone else to blame for indoor radon, and harder still to blame ourselves for PCE exposure from bringing dry-cleaned clothes home. Although, what would be a good story is why efforts to commercialize “green” alternatives to PCE for dry-cleaning are so feeble.

Second, it shows how disconnected the efforts to manage the various forms of environmental risk are. There appears to be as yet no meaningful effort to measure and rank workplace exposure to carcinogens, indoor air contaminants such as radon, dietary carcinogens, global pollutants (POPs), hazardous waste contaminants, toxic air pollutants, and so on, even though your body doesn’t particularly care where the chemical insult is coming from. All of these risks have their own constituencies jockeying for precedence and their own regulatory frameworks, which are often inconsistent and conflicting. EPA’s and OSHA’s disparate risk management philosophies for carcinogens are one glaring example of this problem. EPA has made a stab at developing a framework for performing cumulative risk assessments, which is a step in the right direction. However, it seems largely to sit on the shelf, with little or no movement to translate it into a usable regulatory framework, and no incentive to do so under the Bush administration.